Clustering in Weight Space of Feedforward Nets

Authors

  • Stefan M. Rüger
  • Arnfried Ossen
Abstract

We study symmetries of feedforward networks in terms of their corresponding groups and find that these groups naturally act on and partition weight space. We specify an algorithm to generate representative weight vectors in a specific fundamental domain. The analysis of the metric structure of the fundamental domain enables us to use the location information of weight vector estimates, e.g. for cluster analysis. This can be implemented efficiently even for large networks.
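The symmetries in question are the well-known function-preserving transformations of feedforward nets: for a one-hidden-layer tanh network, flipping the sign of a hidden unit's incoming and outgoing weights (since tanh(-x) = -tanh(x)) and permuting hidden units both leave the network function unchanged. A minimal sketch of mapping a weight vector to a canonical representative of its orbit follows; the function `canonicalize` and its particular normalization rules (non-negative outgoing weights, lexicographic unit ordering) are illustrative assumptions, not the paper's exact construction of the fundamental domain.

```python
import numpy as np

def canonicalize(W1, b1, W2):
    """Map the weights of a one-hidden-layer tanh network to a canonical
    representative of their symmetry-group orbit (illustrative sketch).

    Shapes: W1 is (H, d) input-to-hidden, b1 is (H,) hidden biases,
    W2 is (out, H) hidden-to-output.
    """
    W1, b1, W2 = W1.copy(), b1.copy(), W2.copy()
    H = W1.shape[0]
    # Sign normalization: flip each hidden unit so its first outgoing
    # weight is non-negative (tanh is odd, so this preserves the function).
    for h in range(H):
        if W2[0, h] < 0 or (W2[0, h] == 0 and b1[h] < 0):
            W1[h, :] *= -1
            b1[h] *= -1
            W2[:, h] *= -1
    # Permutation normalization: order hidden units lexicographically by
    # their (outgoing weights, bias, incoming weights) vectors.
    keys = [tuple(np.concatenate([W2[:, h], [b1[h]], W1[h, :]]))
            for h in range(H)]
    order = sorted(range(H), key=lambda h: keys[h])
    return W1[order, :], b1[order], W2[:, order]
```

Two weight vectors that differ only by such sign flips and permutations map to the same representative, so distances and cluster analyses can be computed on the representatives rather than on raw weight estimates.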


Related articles

Simplifying Neural Nets by Discovering Flat Minima

We present a new algorithm for finding low-complexity networks with high generalization capability. The algorithm searches for large connected regions of so-called "flat" minima of the error function. In the weight-space environment of a "flat" minimum, the error remains approximately constant. Using an MDL-based argument, flat minima can be shown to correspond to low expected overfitting. Al...
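The flatness notion in this abstract can be probed numerically: sample random weight perturbations of a fixed radius and record the worst-case increase in error. This is a crude hedged stand-in for the paper's MDL-based criterion; the function `flatness` and its parameters are illustrative assumptions.

```python
import numpy as np

def flatness(loss_fn, w, radius=0.1, n_samples=100, seed=0):
    """Estimate how flat loss_fn is around weight vector w by sampling
    random perturbations of the given radius and returning the worst-case
    increase in loss.  Smaller values indicate a flatter minimum.
    (Illustrative sketch, not the paper's algorithm.)"""
    rng = np.random.default_rng(seed)
    base = loss_fn(w)
    worst = 0.0
    for _ in range(n_samples):
        d = rng.normal(size=w.shape)
        d *= radius / np.linalg.norm(d)  # project onto sphere of given radius
        worst = max(worst, loss_fn(w + d) - base)
    return worst
```

A sharply curved minimum (large second derivative) yields a larger worst-case loss increase than a flat one at the same perturbation radius.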


Generalisation in Cubic Nodes - Centres and Clustering

This lecture deals with training nets of cubic nodes and introduces another major (quite general) algorithm-Reward Penalty. Insight into how we might train nets of cubic nodes is provided by considering the problems associated with generalisation in these nets. We then go on to consider feedback or recurrent nets from the point of view of their implementing iterated feedforward nets (recall thi...


Analog Neural Nets with Gaussian or Other Common Noise Distribution Cannot Recognize Arbitrary Regular Languages

We consider recurrent analog neural nets where the output of each gate is subject to gaussian noise or any other common noise distribution that is nonzero on a sufficiently large part of the state-space. We show that many regular languages cannot be recognized by networks of this type, and we give a precise characterization of languages that can be recognized. This result implies severe constra...




Journal title:

Volume   Issue 

Pages  -

Publication year: 1996